
# 25 billion parameter Transformer

## Qwen2.5 Coder
License: MIT
Qwen2.5-Coder is an advanced language model developed by Qwen Labs, designed specifically for code generation, code understanding, and code completion, and built on a Transformer architecture with 25 billion parameters.
Category: Large Language Model
Publisher: cortexso
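For context, a minimal sketch of running a Qwen2.5-Coder checkpoint for code completion with the Hugging Face `transformers` library is shown below. The checkpoint id, prompt, and generation settings are illustrative assumptions and are not part of this listing.

```python
# Minimal code-completion sketch with Hugging Face transformers.
# The checkpoint id below is an assumption; substitute the Qwen2.5-Coder
# variant you actually intend to use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-Coder-7B-Instruct"  # assumed checkpoint id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Ask the model to write a small function.
prompt = "Write a Python function that checks whether a string is a palindrome."
messages = [{"role": "user", "content": prompt}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```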